
    A software framework for computing Newton polytopes of resultants and (reduced) discriminants

    We present new software for computing Newton polytopes of resultant and discriminant polynomials, and illustrate its use with a number of examples.

    The maximum number of faces of the Minkowski sum of three convex polytopes

    We derive tight expressions for the maximum number of $k$-faces, $0\le k\le d-1$, of the Minkowski sum, $P_1+P_2+P_3$, of three $d$-dimensional convex polytopes $P_1$, $P_2$ and $P_3$ in $\reals^d$, as a function of the number of vertices of the polytopes, for any $d\ge 2$. Expressing the Minkowski sum as a section of the Cayley polytope $\mathcal{C}$ of its summands, counting the $k$-faces of $P_1+P_2+P_3$ reduces to counting the $(k+2)$-faces of $\mathcal{C}$ which meet the vertex sets of the three polytopes. In two dimensions our expressions reduce to known results, while in three dimensions the tightness of our bounds follows by exploiting known tight bounds for the number of faces of $r$ $d$-polytopes in $\reals^d$, where $r\ge d$. For $d\ge 4$, the maximum values are attained when $P_1$, $P_2$ and $P_3$ are $d$-polytopes whose vertex sets are chosen appropriately from three distinct $d$-dimensional moment-like curves.
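    To make the planar case concrete, the following sketch (ours, not the paper's code) checks the known two-dimensional bound numerically: the Minkowski sum of three convex polygons with $m_1$, $m_2$ and $m_3$ vertices has at most $m_1+m_2+m_3$ vertices. The random polygons and the brute-force summation are assumptions made purely for the illustration.

```
# A minimal sketch: in the plane, the Minkowski sum P1 + P2 + P3 of convex
# polygons has at most m1 + m2 + m3 vertices. We check this numerically with
# NumPy/SciPy on random convex polygons.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

def random_convex_polygon(m):
    """Vertices of a convex polygon: hull of m random points (may have < m vertices)."""
    pts = rng.standard_normal((m, 2))
    return pts[ConvexHull(pts).vertices]

def minkowski_sum(*polys):
    """Brute-force Minkowski sum: convex hull of all sums of one vertex per summand."""
    sums = polys[0]
    for P in polys[1:]:
        sums = np.array([a + b for a in sums for b in P])
    return sums[ConvexHull(sums).vertices]

P1, P2, P3 = (random_convex_polygon(m) for m in (5, 6, 7))
S = minkowski_sum(P1, P2, P3)
print("vertices of P1, P2, P3:", len(P1), len(P2), len(P3))
print("vertices of P1+P2+P3 :", len(S), "<=", len(P1) + len(P2) + len(P3))
```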

    Sparse implicitization by interpolation: Characterizing non-exactness and an application to computing discriminants

    We revisit implicitization by interpolation in order to examine its properties in the context of sparse elimination theory. Based on the computation of a superset of the implicit support, implicitization is reduced to computing the nullspace of a numeric matrix. The approach is applicable to polynomial and rational parameterizations of curves and (hyper)surfaces of any dimension, including the case of parameterizations with base points. Our support prediction is based on sparse (or toric) resultant theory, in order to exploit the sparsity of the input and the output. Our method may yield a multiple of the implicit equation: we characterize and quantify this situation by relating the nullspace dimension to the predicted support and its geometry. In this case, we obtain more than one multiple of the implicit equation; the implicit equation can then be recovered via multivariate polynomial gcd (or factoring). All of the above techniques extend to the case of approximate computation, thus yielding a method of sparse approximate implicitization, which is important in tackling larger problems. We discuss our publicly available Maple implementation through several examples, including the benchmark bicubic surface. As a novel application, we focus on computing the discriminant of a multivariate polynomial, which characterizes the existence of multiple roots and generalizes the resultant of a polynomial system. This yields an efficient, output-sensitive algorithm for computing the discriminant polynomial.
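    A minimal sketch of the interpolation step described above, assuming a support superset is already known (here, all monomials of total degree at most 2 for a rational parameterization of the unit circle); the curve and the degree bound are assumptions made for the example, not taken from the paper.

```
# Implicitization by interpolation, assuming a superset of the implicit
# support is known. The kernel of the matrix of monomials evaluated at
# sampled points gives the coefficients of the implicit equation.
import numpy as np

# Rational parameterization of the unit circle: x = (1 - t^2)/(1 + t^2), y = 2t/(1 + t^2)
def sample(t):
    return (1 - t**2) / (1 + t**2), 2 * t / (1 + t**2)

# Assumed support superset: exponents (i, j) of the monomials x^i y^j with i + j <= 2
support = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]

ts = np.linspace(-3.0, 3.0, 20)          # oversample the curve
M = np.array([[x**i * y**j for (i, j) in support] for (x, y) in map(sample, ts)])

# Kernel vector via SVD: right singular vector of the smallest singular value
coeffs = np.linalg.svd(M)[2][-1]
print("support     :", support)
print("coefficients:", np.round(coeffs / np.abs(coeffs).max(), 6))  # ~ x^2 + y^2 - 1 up to scale
```

    When the predicted support is strictly larger than the true one, the nullspace can have dimension greater than one; as the abstract notes, each nullvector is then a multiple of the implicit equation, which a polynomial gcd (or factoring) recovers.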

    An Output-sensitive Algorithm for Computing Projections of Resultant Polytopes

    We develop an incremental algorithm to compute the Newton polytope of the resultant, aka the resultant polytope, or its projection along a given direction. The resultant is fundamental in algebraic elimination and in implicitization of parametric hypersurfaces. Our algorithm exactly computes vertex- and halfspace-representations of the desired polytope using an oracle producing resultant vertices in a given direction. It is output-sensitive as it uses one oracle call per vertex. We overcome the bottleneck of determinantal predicates by hashing, thus accelerating execution from 18 to 100 times. We implement our algorithm using the experimental CGAL package {\tt triangulation}. A variant of the algorithm computes successively tighter inner and outer approximations: when these polytopes have, respectively, 90% and 105% of the true volume, runtime is reduced up to 25 times. Our method computes instances of 5-, 6- or 7-dimensional polytopes with 35K, 23K or 500 vertices, resp., within 2 hr. Compared to tropical geometry software, ours is faster up to dimension 5 or 6, and competitive in higher dimensions.
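    The following sketch illustrates the generic oracle-driven scheme in a toy setting: a black-box oracle returns a vertex of the target polytope extreme in a query direction, and an inner approximation is grown until no facet normal produces a new vertex. It is not the paper's CGAL-based implementation; the toy oracle, seed directions and tolerance are assumptions made for the illustration.

```
# Oracle-driven incremental construction of a polytope's V- and H-representations.
import numpy as np
from scipy.spatial import ConvexHull

def extreme_point(direction, points):
    """Toy vertex oracle: the point of a fixed hidden set extreme in `direction`.
    In the resultant setting this is where the expensive resultant-vertex oracle goes."""
    return points[np.argmax(points @ direction)]

def incremental_polytope(oracle, seed_dirs, tol=1e-9):
    """Grow an inner approximation until no facet normal yields a point outside it."""
    V = np.unique(np.array([oracle(d) for d in seed_dirs]), axis=0)
    while True:
        hull = ConvexHull(V)                 # assumes V is already full-dimensional
        new = []
        for eq in hull.equations:            # facet: eq[:-1] . x + eq[-1] <= 0 inside
            v = oracle(eq[:-1])              # one oracle call per candidate facet
            if eq[:-1] @ v + eq[-1] > tol:   # v lies strictly outside the current hull
                new.append(v)
        if not new:
            # V-representation (vertices) and H-representation (facet inequalities)
            return V[hull.vertices], hull.equations
        V = np.unique(np.vstack([V, new]), axis=0)

rng = np.random.default_rng(1)
hidden = rng.integers(-5, 6, size=(40, 3)).astype(float)   # hidden point set
dirs = np.vstack([np.eye(3), -np.eye(3)])                   # seed directions
verts, facets = incremental_polytope(lambda d: extreme_point(d, hidden), dirs)
print(len(verts), "vertices,", len(facets), "facet inequalities")
```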

    Implicitization of curves and (hyper)surfaces using predicted support

    We reduce implicitization of rational planar parametric curves and (hyper)surfaces to linear algebra, by interpolating the coefficients of the implicit equation. For predicting the implicit support, we focus on methods that exploit input and output structure in the sense of sparse (or toric) elimination theory, namely by computing the Newton polytope of the implicit polynomial via sparse resultant theory. Our algorithm works even in the presence of base points, but in this case the implicit equation is obtained as a factor of the produced polynomial. We implement our methods in Maple, and some in Matlab as well, and study their numerical stability and efficiency on several classes of curves and surfaces. We apply our approach to approximate implicitization, and quantify the accuracy of the approximate output, which turns out to be satisfactory on all tested examples; we also relate our measures to Hausdorff distance. In building a square or rectangular matrix, an important issue is (over)sampling the given curve or surface: we conclude that unitary complexes offer the best tradeoff between speed and accuracy when numerical methods, namely SVD, are employed, whereas for exact kernel computation random integers are the method of choice. We compare our prototype to existing software and find that it is rather competitive.
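    As an illustration of the pipeline, the sketch below oversamples a rational curve, builds a rectangular interpolation matrix, extracts a kernel vector with the SVD, and quantifies the accuracy of the output by its residual on fresh samples. The curve (folium of Descartes), the degree bound and the sample ranges are assumptions made for the example, not taken from the paper.

```
# Interpolation of implicit coefficients from an oversampled rectangular matrix,
# with the accuracy of the (possibly approximate) output measured on fresh samples.
import numpy as np

# Folium of Descartes: x = 3t/(1+t^3), y = 3t^2/(1+t^3); implicit x^3 + y^3 - 3xy = 0
def sample(t):
    w = 1 + t**3
    return 3 * t / w, 3 * t**2 / w

support = [(i, j) for i in range(4) for j in range(4) if i + j <= 3]  # degree <= 3

def interp_matrix(ts):
    pts = [sample(t) for t in ts]
    return np.array([[x**i * y**j for (i, j) in support] for (x, y) in pts])

# Oversampled rectangular matrix + SVD kernel vector
M = interp_matrix(np.linspace(0.1, 4.0, 40))
coeffs = np.linalg.svd(M)[2][-1]

# Quantify accuracy: residual of the recovered polynomial on fresh test samples
test = interp_matrix(np.linspace(4.5, 9.0, 25))
print("max |p(x(t), y(t))| on test samples:", np.max(np.abs(test @ coeffs)))
print("coefficients (normalized):", np.round(coeffs / np.abs(coeffs).max(), 4))
```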

    Sparse implicitization by interpolation: Geometric computations using matrix representations

    Based on the computation of a superset of the implicit support, implicitization of a parametrically given hypersurface is reduced to computing the nullspace of a numeric matrix. Our approach exploits the sparseness of the given parametric equations and of the implicit polynomial. In this work, we study how this interpolation matrix can be used to reduce some key geometric predicates on the hypersurface, namely membership and sidedness for given query points, to simple numerical operations on the matrix. We illustrate our results with examples based on our Maple implementation.
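    A simplified sketch of how such predicates reduce to numerical linear algebra, assuming the interpolation matrix has corank one: membership is tested by appending the query point's monomial row and checking whether the matrix stays rank-deficient, and sidedness by the sign of the interpolated kernel polynomial at the query point. The parabola, the tolerance and the sign convention are assumptions made for the example, not the paper's exact construction.

```
# Membership and sidedness predicates via the interpolation matrix.
import numpy as np

def monomial_row(x, y, support):
    return [x**i * y**j for (i, j) in support]

# Parametric parabola (x, y) = (t^2, t); implicit equation x - y^2 = 0
support = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]
ts = np.linspace(-2, 2, 15)
M = np.array([monomial_row(t**2, t, support) for t in ts])

def membership(q, tol=1e-8):
    """q lies on the curve iff appending its monomial row keeps M rank-deficient."""
    Mq = np.vstack([M, monomial_row(*q, support)])
    return np.linalg.svd(Mq, compute_uv=False)[-1] < tol

def sidedness(q):
    """Sign of the interpolated implicit polynomial at q (fixes one side as +)."""
    c = np.linalg.svd(M)[2][-1]                 # kernel vector ~ implicit coefficients
    c = c * np.sign(c[support.index((1, 0))])   # make the coefficient of x positive
    return np.sign(np.dot(monomial_row(*q, support), c))

print(membership((4.0, 2.0)), membership((4.0, 2.1)))   # on / off the parabola
print(sidedness((0.5, 0.0)), sidedness((-0.5, 0.0)))    # opposite sides
```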

    Interpolation of syzygies for implicit matrix representations

    We examine matrix representations of curves and surfaces based on syzygies and constructed by interpolation through points. They are implicit representations of objects given as point clouds. The corresponding theory, including moving lines, curves and surfaces, has been developed for parametric models. Our contribution is to show how to compute the required syzygies by interpolation when the geometric object is given by a point cloud whose sampling satisfies mild assumptions. We focus on planar and space curves, where the theory of syzygies allows us to design an exact algorithm yielding the optimal implicit expression. The method extends readily to surfaces without base points defined over triangular patches. Our Maple implementation has served to produce the examples in this paper and is available from the authors upon request.
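    A minimal sketch of the moving-line idea for a planar curve: syzygies A(t)x + B(t)y + C(t) of a chosen degree are interpolated from sampled points with (assumed) known parameter values, and the resulting matrix representation M(x, y) drops rank exactly on the curve. The nodal cubic, the degree bound and the tolerances are assumptions made for the example; the paper's algorithm additionally determines the optimal degree and handles point clouds under mild sampling assumptions.

```
# Interpolating moving lines A(t) x + B(t) y + C(t) = 0 from point samples and
# using them as an implicit matrix representation of the curve.
import numpy as np

# Nodal cubic sampled as a point cloud with parameter values:
# (x, y) = (t^2 - 1, t^3 - t), implicit equation y^2 - x^2 (x + 1) = 0.
ts = np.linspace(-2.0, 2.0, 20)
pts = np.array([(t**2 - 1, t**3 - t) for t in ts])

nu = 2  # degree bound for A, B, C

# One linear condition A(t_i) x_i + B(t_i) y_i + C(t_i) = 0 per sample; the unknowns
# are the 3*(nu+1) coefficients of A, B, C in the monomial basis 1, t, ..., t^nu.
rows = []
for t, (x, y) in zip(ts, pts):
    tpow = [t**j for j in range(nu + 1)]
    rows.append([x * m for m in tpow] + [y * m for m in tpow] + tpow)
_, sv, Vt = np.linalg.svd(np.array(rows))
syz = Vt[sv < 1e-9 * sv[0]]          # basis of moving lines found by interpolation
print("independent moving lines of degree <=", nu, ":", len(syz))

def rep_matrix(x, y):
    """Matrix representation M(x, y): one row per moving line, one column per power of t."""
    A, B, C = np.split(syz, 3, axis=1)
    return A * x + B * y + C

def on_curve(x, y, tol=1e-8):
    """For this example, M(x, y) drops rank exactly at points of the curve."""
    return np.linalg.svd(rep_matrix(x, y), compute_uv=False)[-1] < tol

print(on_curve(3.0, 6.0))   # t = 2: on the curve
print(on_curve(3.0, 6.1))   # nearby point off the curve
```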